Approximation and Classification in Medicine with IncNet Neural Networks
Abstract
The structure of the incremental neural network (IncNet) is controlled by growing and pruning to match the complexity of the training data. The Extended Kalman Filter algorithm and its fast version are used as the learning algorithm. Bi-central transfer functions, more flexible than other functions commonly used in artificial neural networks, are employed. The latest improvement is the ability to rotate the contours of constant values of the transfer functions in multidimensional spaces using only N − 1 adaptive parameters. Results on approximation benchmarks and on a real-world psychometric classification problem clearly show the superior generalization performance of the presented network compared with other classification models.

INTRODUCTION

Artificial Neural Networks (ANNs) are applied to many different kinds of problems, such as classification, approximation, pattern recognition, signal processing, prediction, and feature extraction. Most of these problems are solved with ANNs by learning the mapping between the input and output spaces for a given data set S = {〈x1, y1〉, . . . , 〈xn, yn〉}, where 〈xi, yi〉 is an input–output pair (xi ∈ R^N, yi ∈ R). The underlying mapping F(·) can be written as

F(xi) = yi + η,   i = 1, . . . , n   (1)

where η is zero-mean white noise with variance σ_ns. The goal of this paper is to build a network that preserves information with complexity matched to the training data, using an architecture that is able to grow and shrink, and using flexible transfer functions to estimate complex probability density distributions. The best-known local learning models are the radial basis function (RBF) networks (Powell, 1987; Poggio and Girosi, 1990; Bishop, 1991), adaptive kernel methods, and local risk minimization (Girosi, 1998). RBF networks were designed as a solution to the approximation problem in multi-dimensional spaces. The typical form of the RBF network can be written as a weighted sum of basis functions centred at points ti, F(x) = Σ_i wi G(‖x − ti‖).
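To make the RBF-type computation concrete, the sketch below implements a bi-central basis function (a product, over input dimensions, of a pair of opposing sigmoids forming a soft "window") and a small weighted sum of such units. This is an illustrative sketch only: the function names, parameter values, and the exact sigmoid-pair formula are assumptions in the spirit of bi-central functions, not the paper's precise definitions, and the rotation parameters mentioned in the abstract are omitted.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def bicentral(x, t, b, s):
    """Bi-central basis function (illustrative form): in each dimension,
    one sigmoid rises near t - b and a mirrored one falls near t + b,
    forming a soft window of centre t, half-width b, and slope s.
    The per-dimension windows are multiplied together."""
    left = sigmoid(s * (x - t + b))          # rises at t - b
    right = 1.0 - sigmoid(s * (x - t - b))   # falls at t + b
    return float(np.prod(left * right))

def network_output(x, centers, widths, slopes, weights):
    """Generic RBF-type output: F(x) = sum_k w_k * G_k(x), here with
    bi-central units G_k (hypothetical helper, not the paper's API)."""
    return sum(w * bicentral(x, t, b, s)
               for t, b, s, w in zip(centers, widths, slopes, weights))

# Two units in a 2-D input space; all numbers are illustrative.
centers = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
widths  = [np.array([1.0, 1.0]), np.array([0.5, 0.5])]
slopes  = [np.array([4.0, 4.0]), np.array([4.0, 4.0])]
weights = [1.0, -0.5]

y_center = network_output(np.array([0.0, 0.0]), centers, widths, slopes, weights)
y_far    = network_output(np.array([10.0, 10.0]), centers, widths, slopes, weights)
```

The response is large near a unit's centre and decays to zero away from all centres, which is the local behaviour that makes growing (adding a unit where error is high) and pruning (removing a unit with negligible weight) natural for this family of networks.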